Local Rademacher complexity bounds based on covering numbers

Authors

  • Yunwen Lei
  • Lixin Ding
  • Yingzhou Bi
Abstract

This paper provides a general result on controlling local Rademacher complexities, which relates, in an elegant form, complexities defined under a constraint on the expected norm to the corresponding complexities defined under a constraint on the empirical norm. This result is convenient to apply in practice and can yield refined local Rademacher complexity bounds for function classes satisfying general entropy conditions. We demonstrate the power of our complexity bounds by applying them to derive effective generalization error bounds.
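
As a rough illustration of the shape such a result takes (the notation below is ours, not the paper's exact statement), write P f^2 for the expected squared norm and P_n f^2 for its empirical counterpart; one then seeks, with high probability, a comparison of roughly the form

\[
  \mathbb{E}\,\mathfrak{R}_n\{f \in \mathcal{F} : P f^2 \le r\}
  \;\le\; c_1\, \mathfrak{R}_n\{f \in \mathcal{F} : P_n f^2 \le c_2 r\} + \text{lower-order terms},
\]

so that a localized complexity defined through the unobservable expected norm can be controlled by one computable from the sample.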

Similar articles

Error Bounds for Piecewise Smooth and Switching Regression

The paper deals with regression problems, in which the nonsmooth target is assumed to switch between different operating modes. Specifically, piecewise smooth (PWS) regression considers target functions switching deterministically via a partition of the input space, while switching regression considers arbitrary switching laws. The paper derives generalization error bounds in these two settings...

Covering and Rademacher bounds for neural networks

This lecture is the first of three lectures investigating the complexity of neural networks. We'll cover 3 bounds, 2 of which use covering numbers, and at the end show how to relate them to Rademacher complexity. The definition of cover we'll use is as follows. Definition. Say G is a (‖ · ‖_p, ε, S)-cover of F if: • For every f ∈ F, there exists g ∈ G so that ‖g(S) − f(S)‖_p ≤ ε, where g(S) :...
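
The classical bridge from covers to Rademacher complexity, which lectures of this kind typically build toward, is Dudley's entropy integral; in one standard formulation (constants vary across references),

\[
  \widehat{\mathfrak{R}}_S(\mathcal{F})
  \;\le\; \inf_{\alpha \ge 0} \left( 4\alpha
  + \frac{12}{\sqrt{n}} \int_{\alpha}^{\infty}
    \sqrt{\log \mathcal{N}(\epsilon, \mathcal{F}, L_2(P_n))}\, d\epsilon \right),
\]

where \mathcal{N}(\epsilon, \mathcal{F}, L_2(P_n)) is the \epsilon-covering number of \mathcal{F} under the empirical L_2 metric, i.e. the size of the smallest cover in the sense of the definition above.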

Bounds in Terms of Rademacher Averages

So far we have seen how to obtain generalization error bounds for learning algorithms that pick a function from a function class of limited capacity or complexity, where the complexity of the class is measured using the growth function or VC-dimension in the binary case, and using covering numbers or the fat-shattering dimension in the real-valued case. These complexity measures, however, do not t...
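
A representative bound of this kind, in one standard form: for a class \mathcal{F} of [0,1]-valued functions and an i.i.d. sample z_1, ..., z_n, with probability at least 1 − δ every f ∈ \mathcal{F} satisfies

\[
  \mathbb{E} f(z) \;\le\; \frac{1}{n} \sum_{i=1}^n f(z_i)
  + 2\,\mathfrak{R}_n(\mathcal{F}) + \sqrt{\frac{\log(1/\delta)}{2n}},
\]

where \mathfrak{R}_n(\mathcal{F}) is the Rademacher complexity of \mathcal{F}; its empirical version is computable from the data, which is exactly the advantage over VC-type quantities.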

On the Complexity of Linear Prediction: Risk Bounds, Margin Bounds, and Regularization

This work characterizes the generalization ability of algorithms whose predictions are linear in the input vector. To this end, we provide sharp bounds for Rademacher and Gaussian complexities of (constrained) linear classes, which directly lead to a number of generalization bounds. This derivation provides simplified proofs of a number of corollaries including: risk bounds for linear predictio...
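
For intuition, one of the sharp bounds established in this line of work (the ℓ_2-constrained case): for \mathcal{F} = \{x \mapsto \langle w, x \rangle : \|w\|_2 \le W\} and a sample with \|x_i\|_2 \le X for all i,

\[
  \widehat{\mathfrak{R}}_S(\mathcal{F}) \;\le\; \frac{W X}{\sqrt{n}},
\]

a dimension-free bound which, via standard contraction and concentration arguments, translates directly into risk and margin bounds.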

Local Rademacher Complexities

We propose new bounds on the error of learning algorithms in terms of a data-dependent notion of complexity. The estimates we establish give optimal rates and are based on a local and empirical version of Rademacher averages, in the sense that the Rademacher averages are computed from the data, on a subset of functions with small empirical error. We present some applications to classification, ...
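
As a minimal computational sketch of what "Rademacher averages computed from the data, on a subset of functions with small empirical error" means, the Python snippet below Monte Carlo-estimates the local empirical Rademacher average of a finite class from its predictions on the sample; the finite-class setting, function name, and arguments are our illustrative assumptions, not the paper's method.

    import numpy as np

    def local_empirical_rademacher(preds, emp_errors, r, n_mc=2000, seed=0):
        """Monte Carlo estimate of E_sigma sup_{f : err(f) <= r} (1/n) sum_i sigma_i f(z_i)
        for a finite class whose values on the sample are the rows of `preds`.

        preds      : (m, n) array with preds[j, i] = f_j(z_i)
        emp_errors : (m,) array of empirical errors, one per function
        r          : localization radius (keep only functions with error <= r)
        """
        rng = np.random.default_rng(seed)
        local = preds[emp_errors <= r]            # localize to low-error functions
        if local.shape[0] == 0:
            return 0.0
        n = local.shape[1]
        draws = []
        for _ in range(n_mc):
            sigma = rng.choice([-1.0, 1.0], size=n)    # i.i.d. Rademacher signs
            draws.append(np.max(local @ sigma) / n)    # sup over the localized class
        return float(np.mean(draws))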

Journal:
  • Neurocomputing

Volume: 218    Issue: -

Pages: -

Publication date: 2016